Multiple birth support vector machine based on Rescaled Hinge loss function
LI Hui, YANG Zhixia
Journal of Computer Applications 2020, 40(11): 3139-3145. DOI: 10.11772/j.issn.1001-9081.2020030381
As the performance of multi-classification learning models is affected by outliers, a Multiple Birth Support Vector Machine based on the Rescaled Hinge loss function (RHMBSVM) was proposed. First, the corresponding optimization problem was constructed by introducing a bounded, non-convex Rescaled Hinge loss function. Then, conjugate function theory was used to transform the optimization problem equivalently. Finally, a variable alternation strategy was used to form an iterative algorithm for solving the non-convex optimization problem. The penalty weight of each sample point was adjusted automatically during the solution process, so that the effect of outliers on the K hyperplanes was eliminated and robustness was enhanced. Numerical experiments were carried out with 5-fold cross-validation. The results show that, on datasets without outliers, the accuracy of the proposed method is 1.11 percentage points higher than that of Multiple Birth Support Vector Machine (MBSVM) and 0.74 percentage points higher than that of Robust Support Vector Machine based on the Rescaled Hinge loss function (RSVM-RHHQ); on datasets with outliers, the accuracy of the proposed method is 2.10 percentage points higher than that of MBSVM and 1.47 percentage points higher than that of RSVM-RHHQ. The experimental results verify the robustness of the proposed method in solving multi-classification problems with outliers.
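The bounded loss described in the abstract can be sketched as follows. This is an illustrative stand-in based on the standard rescaled hinge form from the literature; the parameter eta and the normalizing constant beta below are assumptions for illustration, not values taken from the paper:

```python
import numpy as np

def hinge(margin):
    """Classical (unbounded) hinge loss for a signed margin y*f(x)."""
    return np.maximum(0.0, 1.0 - margin)

def rescaled_hinge(margin, eta=0.5):
    """Bounded, non-convex rescaling of the hinge loss.

    The loss saturates at beta = 1/(1 - exp(-eta)) for badly
    misclassified points, so any single outlier contributes at most
    a constant to the objective.
    """
    beta = 1.0 / (1.0 - np.exp(-eta))
    return beta * (1.0 - np.exp(-eta * hinge(margin)))
```

Because the loss is bounded, a point far on the wrong side of a hyperplane adds at most beta to the objective; this boundedness is what allows the iterative solver to down-weight outliers automatically.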
Incremental robust non-negative matrix factorization with sparseness constraints and its application
YANG Liangdong, YANG Zhixia
Journal of Computer Applications 2019, 39(5): 1275-1281. DOI: 10.11772/j.issn.1001-9081.2018092032
Aiming at the problem that the computational scale of Robust Non-negative Matrix Factorization (RNMF) grows with the number of training samples, an incremental robust non-negative matrix factorization algorithm with sparseness constraints was proposed. Firstly, robust non-negative matrix factorization was performed on the initial data. Then, the factorized result was used in the subsequent iterative computations. Finally, under sparseness constraints, the coefficient matrix was combined with incremental learning, which made the objective function value decrease faster in the iterative solution, reduced the computational cost, and improved the sparseness of the factorized data. In numerical experiments, the proposed algorithm was compared with the RNMF algorithm and the RNMF with Sparseness Constraints (RNMFSC) algorithm. Experimental results on the ORL and YALE face databases show that the proposed algorithm outperforms the other two algorithms in terms of running time and sparseness of the factorized data, and achieves a better clustering effect; in particular, on the YALE face database with 3 clusters, the clustering accuracy of the proposed algorithm reaches 91.67%.
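The incremental idea can be illustrated with a minimal multiplicative-update NMF sketch. The L1 sparseness penalty and the update rules below are a generic textbook stand-in, not the paper's exact RNMF formulation; the point is only that, once a basis W has been learned, newly arriving samples require solving for their coefficients alone:

```python
import numpy as np

def nmf_sparse(V, rank, n_iter=200, lam=0.1, seed=0):
    """Batch NMF (V ~= W @ H) with an L1 sparseness penalty on H."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + 1e-3
    H = rng.random((rank, n)) + 1e-3
    eps = 1e-9
    for _ in range(n_iter):
        # Multiplicative updates keep W and H non-negative throughout.
        H *= (W.T @ V) / (W.T @ W @ H + lam + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def encode_new_samples(W, V_new, n_iter=100, lam=0.1, seed=1):
    """Incremental step: keep the learned basis W fixed and solve only
    for the coefficients of the newly arrived samples, so the cost does
    not grow with the number of previously seen samples."""
    rng = np.random.default_rng(seed)
    H_new = rng.random((W.shape[1], V_new.shape[1])) + 1e-3
    eps = 1e-9
    for _ in range(n_iter):
        H_new *= (W.T @ V_new) / (W.T @ W @ H_new + lam + eps)
    return H_new
```

Fixing W when new columns arrive is what keeps the per-update cost independent of the number of samples already processed, which is the motivation the abstract describes.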